1. Multilingual Language Model Adaptive Fine-Tuning: A Study on African Languages ...
Abstract: Multilingual pre-trained language models (PLMs) have demonstrated impressive performance on several downstream tasks for both high-resourced and low-resourced languages. However, there is still a large performance drop for languages unseen during pre-training, especially African languages. One of the most effective approaches to adapt to a new language is language adaptive fine-tuning (LAFT) -- fine-tuning a multilingual PLM on monolingual texts of a language using the same pre-training objective. However, few African languages have large monolingual corpora, and adapting to each of them individually takes up large disk space and limits the cross-lingual transfer abilities of the resulting models because they have been specialized for a single language. In this paper, we perform multilingual adaptive fine-tuning (MAFT) on 17 of the most-resourced African languages and three other high-resource languages widely spoken on the African continent -- English, French, and Arabic -- to encourage cross-lingual transfer learning. ...
Comment: Accepted to AfricaNLP 2022 (non-archival)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2204.06487 ; DOI: https://dx.doi.org/10.48550/arxiv.2204.06487
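
The abstract above describes language adaptive fine-tuning (LAFT) and its multilingual variant (MAFT): continuing masked language model pre-training of a multilingual PLM on monolingual text (or, for MAFT, on the concatenated texts of several languages). A minimal sketch of that recipe follows, assuming the Hugging Face transformers and datasets libraries; it is not the paper's released code, and the model name, corpus file, and hyperparameters are placeholder assumptions.

```python
# Sketch of language adaptive fine-tuning (LAFT): continue masked language model
# (MLM) pre-training of a multilingual PLM on monolingual text. For MAFT, the
# training file would instead hold concatenated text from several languages.
# Model name, file path, and hyperparameters below are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "xlm-roberta-base"  # any multilingual PLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Raw monolingual corpus, one document (or paragraph) per line.
dataset = load_dataset("text", data_files={"train": "monolingual_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Reuse the same pre-training objective: MLM with 15% token masking.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="plm-laft",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()  # the adapted model can then be fine-tuned on downstream tasks
```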

2. Preventing author profiling through zero-shot multilingual back-translation
In: 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2021, Punta Cana, Dominican Republic ; https://hal.inria.fr/hal-03350906 (2021)

3. On the effect of normalization layers on Differentially Private training of deep Neural networks
In: https://hal.inria.fr/hal-03475600 (2021)

4. Adapting Language Models When Training on Privacy-Transformed Data
In: INTERSPEECH 2021 ; https://hal.inria.fr/hal-03189354 (2021)

5. Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study ...

6. ANEA: Distant Supervision for Low-Resource Named Entity Recognition ...

8. Integrating Unsupervised Data Generation into Self-Supervised Neural Machine Translation for Low-Resource Languages ...

9. Preventing Author Profiling through Zero-Shot Multilingual Back-Translation ...

10. Modeling Profanity and Hate Speech in Social Media with Semantic Subspaces ...

11. Exploring the Potential of Lexical Paraphrases for Mitigating Noise-Induced Comprehension Errors ...

12. On the Correlation of Context-Aware Language Models With the Intelligibility of Polish Target Words to Czech Readers
In: Frontiers in Psychology (2021)

13. Transfer learning and distant supervision for multilingual Transformer models: A study on African languages
In: 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2020, Punta Cana, Dominican Republic ; https://hal.inria.fr/hal-03350901 (2020)

14. Distant supervision and noisy label learning for low resource named entity recognition: A study on Hausa and Yorùbá
In: ICLR Workshops (AfricaNLP & PML4DC 2020), Apr 2020, Addis Ababa, Ethiopia ; https://hal.archives-ouvertes.fr/hal-03359111 (2020)

15. Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages ...

16. Rediscovering the Slavic Continuum in Representations Emerging from Neural Models of Spoken Language Identification ...

17. On the Interplay Between Fine-tuning and Sentence-level Probing for Linguistic Knowledge in Pre-trained Transformers ...

18. A Closer Look at Linguistic Knowledge in Masked Language Models: The Case of Relative Clauses in American English ...